In the heat of passion, you fired off a nasty email or text message to a co-worker or family member and regretted it as soon as you hit “send.”
After an infuriating argument at the cable company office over a billing error, you found yourself driving more aggressively than normal.
These aren’t unusual situations. In fact, they’re incredibly common.
Someday, technology could prevent them from happening…
Computers are learning to understand emotions.
I’ve written about emotional and behavioral analytics a few times in recent months, but it’s been in slightly different fields.
First, I covered Facebook’s nefarious experiment with emotional manipulation by controlling what information users got in their feed.
Then I talked about facial expression analysis in the wearable age from the companies Affectiva, Emotient, and Sension. These companies are working with visual data such as photographs and live video feeds to create intelligent systems that can ascertain the emotions of the people in the picture.
Now we’re into something slightly different: reading emotions from behavioral context.
The Nasty Email
That hastily sent angry email could be a thing of the past thanks to the world’s most famous piece of artificial intelligence, Watson.
After IBM’s (NYSE: IBM) natural language processing system won the TV game show Jeopardy!, it was turned into a billion-dollar commercial business unit. The cognitive power of the Watson system is now available for everyday business use.
One of the new features is called Watson Tone Analyzer. As the name suggests, the tool analyzes the content of a text and deduces the overall “tone” of the piece. It can read through documents, emails, presentations, and marketing copy and suggest linguistic changes that would improve the document’s impact.
That negative email, for example, could be analyzed and turned into a friendlier, more productive document that doesn’t cause any unnecessary friction between colleagues.
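Watson’s actual model is proprietary, but the basic idea of scoring a text against emotion-laden word lists can be sketched in a few lines. The tone categories and lexicons below are invented for illustration; they are not IBM’s.

```python
# Toy tone scorer: counts words from small hand-made lexicons and
# reports each tone as a fraction of the text's words.
# The categories and word lists are illustrative inventions,
# not Watson Tone Analyzer's actual (proprietary) model.
import re
from collections import Counter

LEXICONS = {
    "anger":      {"furious", "nasty", "unacceptable", "stupid", "hate"},
    "courtesy":   {"please", "thanks", "appreciate", "grateful", "kindly"},
    "analytical": {"therefore", "however", "because", "data", "measure"},
}

def tone_scores(text):
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    total = sum(words.values()) or 1
    return {tone: sum(words[w] for w in lexicon) / total
            for tone, lexicon in LEXICONS.items()}

draft = "This is unacceptable. I hate this stupid billing error."
print(tone_scores(draft))
```

A real system would, of course, use far richer linguistic features than bare word counts, but even this toy version flags the draft above as angry and could prompt a rewrite before “send.”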
This very section of text was analyzed by the free Tone Analyzer tool on IBM’s demonstration site. As you can see, it highlights the important words and comes to a conclusion about the overall tone.
The picture illustrates that it has a “social” tone with conscientious, open, and analytical words. I don’t think it requires any immediate tone correction.
This tool is based on psycholinguistic theory. It breaks every word down into a list of nine possible synonyms, hypernyms, and hyponyms, measuring their connections to the surrounding words, and on that basis offers users suggestions on how to strengthen or soften their message.
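The suggestion step can be illustrated with a toy substitution pass. The harsh-to-mild word table below is an invented example, not IBM’s synonym database.

```python
# Toy "soften the message" pass: swaps harsh words for milder
# synonyms while preserving capitalization. The substitution
# table is an invented illustration of the idea, not Watson's
# real psycholinguistic model.
import re

SOFTER = {
    "demand": "request",
    "failure": "shortfall",
    "wrong": "inaccurate",
    "immediately": "as soon as possible",
}

def soften(text):
    def swap(match):
        word = match.group(0)
        repl = SOFTER.get(word.lower(), word)
        return repl.capitalize() if word[0].isupper() else repl
    return re.sub(r"[A-Za-z]+", swap, text)

print(soften("I demand you fix this wrong invoice immediately."))
# → "I request you fix this inaccurate invoice as soon as possible."
```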
This tool isn’t ready for production use, but it’s a peek at what a robust artificial intelligence can do for interpersonal communications.
The Aggressive Driver
A research project from the Affective Computing Group at MIT’s Media Lab called AutoEmotive corrects for human emotion in driving to prevent aggressive or irresponsible driving as a result of the driver’s emotional state.
The idea is that a car would be equipped with devices that can approximate the driver’s stress level. This wouldn’t require many changes to the car itself: speed, acceleration, and braking data, GPS position, and voice interactions with the car’s onboard voice command system could all supply adequate signals.
In fact, GPS data alone could be enough to indicate state of mind. A study published this year by Northwestern University researchers showed that GPS and usage sensors in a smartphone can detect signs of depression with 86.5% accuracy.
In the AutoEmotive design, however, more information is collected. Electrode contacts on the steering wheel measure the driver’s temperature, grip pressure, and electrodermal activity, while a camera measures facial expression, breathing, and heart rate.
When these systems detect abnormal amounts of stress, the car could turn on relaxing music, drop the cabin temperature, change the voice of the navigation computer to something soothing, display stress measurements on the dashboard, adjust the headlights to compensate for potential tunnel vision, or even change the color of the car with thermochromic paint to warn other drivers and pedestrians that the driver is under stress.
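AutoEmotive’s actual fusion logic isn’t detailed here, but a simple threshold-based selector conveys the idea. The signal names, weights, and thresholds below are assumptions made for illustration, not the MIT project’s design.

```python
# Toy stress-response selector: fuses a few normalized sensor
# readings (0.0 = calm, 1.0 = maximum stress) into one score,
# then picks escalating interventions. Signals, weights, and
# thresholds are invented for illustration; they are not
# AutoEmotive's actual design.

WEIGHTS = {"grip_pressure": 0.4, "heart_rate": 0.4, "hard_braking": 0.2}

def stress_score(readings):
    return sum(WEIGHTS[name] * readings.get(name, 0.0) for name in WEIGHTS)

def interventions(readings):
    score = stress_score(readings)
    actions = []
    if score > 0.5:
        actions += ["play relaxing music", "lower cabin temperature"]
    if score > 0.8:
        actions += ["show stress level on dashboard", "warn nearby drivers"]
    return actions

calm  = {"grip_pressure": 0.2, "heart_rate": 0.3, "hard_braking": 0.0}
tense = {"grip_pressure": 0.9, "heart_rate": 0.9, "hard_braking": 1.0}
print(interventions(calm))   # → []
print(interventions(tense))  # → all four actions
```

The escalation structure matters: mild stress gets unobtrusive responses, while only high confidence triggers visible warnings, since falsely telling other drivers you are enraged would itself be stressful.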
Though we often don’t understand our own emotions, they are incredibly predictable. This is because they’re deeply linked to our brains, our glands, our stomachs, and our immune systems. When something changes in one of those systems, emotional changes are sure to follow, and those changes show up in everything we do.
And that’s going to be fundamental data for computing in the near future.
Good Investing,
Tim Conneally
For the last seven years, Tim Conneally has covered the world of mobile and wireless technology, enterprise software, network hardware, and next generation consumer technology. Tim has previously written for long-running software news outlet Betanews and for financial media powerhouse Forbes.